Towards Automated Extraction of Tempo Parameters from Expressive Music Recordings
Abstract
A performance of a piece of music depends heavily on the musician's or conductor's individual vision and personal interpretation of the given musical score. As a basis for the analysis of artistic idiosyncrasies, one requires accurate annotations that reveal the exact timing and intensity of the various note events occurring in the performances. In the case of audio recordings, this annotation is often done manually, which is prohibitively expensive for large music collections. In this paper, we present a fully automatic approach for extracting temporal information from a music recording using score-audio synchronization techniques. This information is given in the form of a tempo curve that reveals the relative tempo difference between an actual performance and a reference representation of the underlying musical piece. As our experiments on harmony-based Western music show, our approach captures the overall tempo flow and, for certain classes of music, even finer expressive tempo nuances.
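A tempo curve of the kind described can be derived from the alignment path that score-audio synchronization produces. The following is a minimal sketch, not the authors' implementation: the windowed-slope estimate, the parameter names, and the toy alignment path are all illustrative assumptions.

```python
import numpy as np

def tempo_curve(path, window=4.0):
    """Estimate a relative tempo curve from a score-audio alignment path.

    path:   array of shape (N, 2) holding (reference_time, performance_time)
            pairs in seconds, both columns monotonically increasing.
    window: width (in reference seconds) of the sliding window used to
            smooth the local slope estimate.

    Returns (ref_times, tempo), where tempo > 1 means the performance is
    faster than the reference at that point.
    """
    ref, perf = path[:, 0], path[:, 1]
    centers, tempi = [], []
    for t in ref:
        mask = (ref >= t - window / 2) & (ref <= t + window / 2)
        if mask.sum() < 2:
            continue
        # Local tempo = reference time elapsed per performance time elapsed.
        dref = ref[mask][-1] - ref[mask][0]
        dperf = perf[mask][-1] - perf[mask][0]
        if dperf > 0:
            centers.append(t)
            tempi.append(dref / dperf)
    return np.array(centers), np.array(tempi)

# Toy path: the performance runs at half the reference tempo throughout,
# so the curve is constant at 0.5.
path = np.stack([np.arange(0, 20, 0.5), np.arange(0, 40, 1.0)], axis=1)
times, tempo = tempo_curve(path)
```

With a real synchronization result, the path would come from an alignment algorithm such as dynamic time warping rather than being constructed by hand.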
Similar Resources
Extracting Expressive Tempo Curves from Music Recordings
Musicians give a piece of music their personal touch by continuously varying tempo, dynamics, and articulation. Instead of playing mechanically, they speed up in some places and slow down in others in order to shape the piece of music. Similarly, they continuously change the sound intensity and stress certain notes. The automated analysis of different interpretations, also referred to as perform...
CUEX: An Algorithm for Automatic Extraction of Expressive Tone Parameters in Music Performance from Acoustic Signals
CUEX is an algorithm that extracts, from recordings of solo music performances, the tone parameters tempo, sound level, articulation, onset velocity, spectrum, vibrato rate, and vibrato extent. The aim is to capture the expressive variations in a music performance rather than to identify the musical notes played. The CUEX algorithm uses a combination of traditional methods to segment the aud...
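To illustrate two of the tone parameters listed above (local tempo and articulation), here is a small sketch that derives per-note values from note boundaries. This is not the CUEX algorithm itself: the function name is hypothetical, and the onset/offset lists are assumed to come from a separate segmentation step.

```python
def tone_parameters(onsets, offsets):
    """Per-note timing parameters from detected note boundaries.

    onsets, offsets: note start/end times in seconds, one pair per note.
    Returns one (ioi, articulation) tuple per note except the last,
    which has no following onset.
    """
    params = []
    for i in range(len(onsets) - 1):
        # Inter-onset interval: its inverse tracks the local tempo.
        ioi = onsets[i + 1] - onsets[i]
        # Fraction of the interval the note actually sounds:
        # near 1.0 is legato, well below 1.0 is staccato.
        articulation = (offsets[i] - onsets[i]) / ioi
        params.append((ioi, articulation))
    return params

# Three detected notes, each sounding 0.4 s, with onsets 0.5 s apart.
params = tone_parameters([0.0, 0.5, 1.0], [0.4, 0.9, 1.4])
```

For this example, both notes yield an inter-onset interval of 0.5 s and an articulation ratio of 0.8.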
Expressive Modifications of Musical Audio Recordings: Preliminary Results
A system is described that aims to modify the performance of a musical recording (classical music) by changing the basic performance parameters: tempo, sound level, and tone duration. The input audio file is aligned with the corresponding score, which also contains extra information defining rule-based modifications of these parameters. The signal is decomposed using analysis-synthesis techniques...
Correlated Microtiming Deviations in Jazz and Rock Music
Musical rhythms performed by humans typically show temporal fluctuations. While such fluctuations have been characterized in simple rhythmic tasks, their nature when several musicians perform music jointly, in all its natural complexity, remains an open question. To study such fluctuations in over 100 original jazz and rock/pop recordings played with and without a metronome, we deve...
What Makes Beat Tracking Difficult? A Case Study on Chopin Mazurkas
The automated extraction of tempo and beat information from music recordings is a challenging task. Especially in the case of expressive performances, current beat tracking approaches still have significant difficulty accurately capturing local tempo deviations and beat positions. In this paper, we introduce a novel evaluation framework for detecting critical passages in a piece of music that ar...